Asymptotic description of stochastic neural networks. I. Existence of a large deviation principle

Authors

Abstract


Related articles

Stochastic Sub-Additivity Approach to the Conditional Large Deviation Principle

University of Chicago. Given two Polish spaces A_X and A_Y, let ρ : A_X × A_Y → R be a bounded measurable function. Let X = (X_n)_{n≥1} and Y = (Y_n)_{n≥1} be two independent stationary processes on A_X^∞ and A_Y^∞, respectively. The article studies the large deviation principle (LDP) for n^{-1} ∑_{k=1}^{n} ρ(X_k, Y_k), conditional on X. Based on a stochastic version of approximate subadditivity, it is shown that if Y s...
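Several of the entries on this page invoke a large deviation principle without restating it; for reference, the standard textbook definition is as follows (symbols μ_n and I here are the generic ones of the definition, not specific to any entry above):

```latex
% A sequence of probability measures (\mu_n) on a topological space E
% satisfies a large deviation principle with speed n and (lower
% semicontinuous) rate function I : E \to [0,\infty] if, for every
% closed set F \subseteq E and every open set G \subseteq E,
\limsup_{n\to\infty} \frac{1}{n}\log\mu_n(F) \;\le\; -\inf_{x\in F} I(x),
\qquad
\liminf_{n\to\infty} \frac{1}{n}\log\mu_n(G) \;\ge\; -\inf_{x\in G} I(x).
```

In the conditional LDP of the entry above, the role of μ_n is played by the conditional law of the empirical mean n^{-1} ∑_{k=1}^{n} ρ(X_k, Y_k) given X.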


A large deviation principle for Dirichlet posteriors

Let (X_k) be a sequence of independent and identically distributed random variables taking values in a compact metric space Ω, and consider the problem of estimating the law of X_1 in a Bayesian framework. A conjugate family of priors for non-parametric Bayesian inference is the family of Dirichlet process priors popularized by Ferguson. We prove that if the prior distribution is Dirichlet, then the sequence...


A large deviation principle with queueing applications

In this paper, we present a large deviation principle for partial sums processes indexed by the half line, which is particularly suited to queueing applications. The large deviation principle is established in a topology that is finer than the topology of uniform convergence on compacts and in which the queueing map is continuous. Consequently, a large deviation principle for steady-state queue...



A stochastic-field description of finite-size spiking neural networks

Neural network dynamics are governed by the interaction of spiking neurons. Stochastic aspects of single-neuron dynamics propagate up to the network level and shape the dynamical and informational properties of the population. Mean-field models of population activity disregard the finite-size stochastic fluctuations of network dynamics and thus offer a deterministic description of the system. H...



Journal

Journal title: Comptes Rendus Mathematique

Year: 2014

ISSN: 1631-073X

DOI: 10.1016/j.crma.2014.08.018